Strong Universal Pointwise Consistency of Some Regression Function Estimates



Similar Articles

Strong Universal Consistency of Smooth Kernel Regression Estimates

The paper deals with kernel estimates of Nadaraya-Watson type for a regression function with square integrable response variable. For usual bandwidth sequences and smooth nonnegative kernels, e.g., Gaussian and quartic kernels, strong L2-consistency is shown without any further condition on the underlying distribution. The proof uses a Tauberian theorem for Cesàro summability. Let X be a d-dime...
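For reference, the following is a minimal one-dimensional sketch of a Nadaraya-Watson estimate with a Gaussian kernel; the bandwidth rule, the synthetic data, and the function name `nadaraya_watson` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nadaraya_watson(x, X, Y, h):
    """Kernel-weighted average estimate of m(x) = E[Y | X = x]."""
    # Gaussian kernel weights K((x - X_i) / h) for each sample point
    weights = np.exp(-0.5 * ((x - X) / h) ** 2)
    denom = weights.sum()
    if denom == 0.0:
        return 0.0  # convention when all weights vanish numerically
    return float(weights @ Y / denom)

# Illustrative usage on synthetic data with m(x) = sin(x) and Gaussian noise.
rng = np.random.default_rng(0)
n = 500
X = rng.uniform(-3.0, 3.0, size=n)
Y = np.sin(X) + rng.normal(scale=0.3, size=n)
h = n ** (-1.0 / 5.0)  # a standard bandwidth rate h_n ~ n^(-1/5), assumed here
print(nadaraya_watson(0.5, X, Y, h), np.sin(0.5))
```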


Strong consistency of least-squares estimates in regression models.

A general theorem on the limiting behavior of certain weighted sums of i.i.d. random variables is obtained. This theorem is then applied to prove the strong consistency of least-squares estimates in linear and nonlinear regression models with i.i.d. errors under minimal assumptions on the design and weak moment conditions on the errors.
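As a rough illustration of such a consistency statement, the sketch below fits least-squares estimates on growing i.i.d. samples from an assumed linear model and watches the estimate approach the true coefficients; the model, coefficient values, and error distribution are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
beta_true = np.array([2.0, -1.0])  # assumed true coefficients

for n in (100, 1_000, 10_000):
    X = rng.normal(size=(n, 2))                        # i.i.d. design
    eps = rng.standard_t(df=5, size=n)                 # i.i.d. errors with finite variance
    y = X @ beta_true + eps
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares estimate
    print(n, beta_hat)  # drifts toward [2, -1] as n grows
```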


Strong Consistency of Kernel Regression Estimate

In this paper, regression function estimation from independent and identically distributed data is considered. We establish strong pointwise consistency of the famous Nadaraya-Watson estimator under weaker conditions, which permit the use of kernels with unbounded support and even non-integrable ones, and provide a general approach for constructing strongly consistent kernel estimates of regression...


On the Strong Universal Consistency of Nearest Neighbor Regression Function Estimates, by Luc Devroye, László Györfi, Adam Krzyżak and Gábor Lugosi

The estimate is $m_n(x) = \sum_{i=1}^{n} W_{ni}(x; X_1, \ldots, X_n)\, Y_i$, where $W_{ni}(x; X_1, \ldots, X_n)$ equals $1/k$ if $X_i$ is one of the $k$ nearest neighbors of $x$ among $X_1, \ldots, X_n$, and $W_{ni}$ is zero otherwise. Note in particular that $\sum_{i=1}^{n} W_{ni} = 1$. The k-nearest neighbor estimate was studied by Cover (1968). For a survey of other estimates, see, for example, Collomb (1981, 1985) or Györfi (1981). We are concerned with the L1 converge...
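A minimal one-dimensional sketch of the weights just described (1/k on the k nearest neighbors, 0 elsewhere) might look as follows; the function name, the synthetic data, and the choice of k are illustrative assumptions.

```python
import numpy as np

def knn_regression(x, X, Y, k):
    """k-NN estimate m_n(x): average of Y_i over the k sample points closest to x."""
    dists = np.abs(X - x)              # distances |X_i - x| in one dimension
    nearest = np.argsort(dists)[:k]    # indices of the k nearest neighbors
    return float(Y[nearest].mean())    # each selected neighbor gets weight 1/k

# Illustrative usage on synthetic data with m(x) = x^2.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=200)
Y = X ** 2 + rng.normal(scale=0.1, size=200)
print(knn_regression(0.5, X, Y, k=15))  # should be close to 0.25
```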


Pointwise Tracking the Optimal Regression Function

This paper examines the possibility of a ‘reject option’ in the context of least squares regression. It is shown that using rejection it is theoretically possible to learn ‘selective’ regressors that can ε-pointwise track the best regressor in hindsight from the same hypothesis class, while rejecting only a bounded portion of the domain. Moreover, the rejected volume vanishes with the training ...
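To make the reject-option idea concrete, here is a toy selective regressor that predicts only where enough training points fall nearby and abstains elsewhere; this abstention rule and all names are illustrative assumptions, not the construction analyzed in the paper.

```python
import numpy as np

def selective_knn(x, X, Y, k, radius):
    """Return a k-NN prediction, or None (reject) where local data is too sparse."""
    dists = np.abs(X - x)
    if np.sum(dists <= radius) < k:    # not enough nearby evidence: abstain
        return None
    nearest = np.argsort(dists)[:k]
    return float(Y[nearest].mean())

# Predictions are made near the bulk of the data and rejected far in the tails.
rng = np.random.default_rng(2)
X = rng.normal(size=300)
Y = np.cos(X) + rng.normal(scale=0.1, size=300)
for query in (0.0, 4.0):
    print(query, selective_knn(query, X, Y, k=10, radius=0.5))
```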



Journal

Journal title: Journal of Multivariate Analysis

Year: 1999

ISSN: 0047-259X

DOI: 10.1006/jmva.1999.1836